# Entity-aware Attention
| Model | Author | License | Tags | Downloads | Likes | Description |
|-------|--------|---------|------|----------:|------:|-------------|
| mLUKE Large Lite | studio-ousia | Apache-2.0 | Large Language Model · Transformers · Multilingual | 65 | 2 | Multilingual extension of LUKE, supporting named entity recognition, relation classification, and question answering in 24 languages. |
| mLUKE Base Lite | studio-ousia | Apache-2.0 | Large Language Model · Transformers · Multilingual | 153 | 2 | Multilingual extension of LUKE, supporting text processing tasks in 24 languages. |
| mLUKE Base | studio-ousia | Apache-2.0 | Large Language Model · Transformers · Multilingual | 64 | 6 | Multilingual extension of LUKE, supporting named entity recognition, relation classification, and question answering in 24 languages. |
| LUKE Base | studio-ousia | Apache-2.0 | Large Language Model · Transformers · English | 2,358 | 21 | Transformer-based pre-trained model for words and entities, providing deep contextual representations through an entity-aware self-attention mechanism. |
| LUKE Large | studio-ousia | Apache-2.0 | Large Language Model · Transformers · English | 1,040 | 7 | Transformer-based pre-trained model for words and entities, providing deep contextual representations through an entity-aware self-attention mechanism. |
| mLUKE Large | studio-ousia | Apache-2.0 | Large Language Model · Transformers · Multilingual | 70 | 2 | Multilingual extension of LUKE, supporting named entity recognition, relation classification, and question answering in 24 languages. |
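The entity-aware self-attention shared by the models above treats words and entities as a single input sequence, but selects a different query matrix depending on whether the attending token and the attended token are words or entities (four query matrices, with shared key and value projections, as described in the LUKE paper). A minimal pure-Python sketch of that mechanism, not the models' actual implementation, with all matrix names and the two-dimensional toy setup being illustrative assumptions:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def matvec(M, v):
    # Multiply a matrix (list of rows) by a vector.
    return [dot(row, v) for row in M]

def entity_aware_attention(tokens, types, W_q, W_k, W_v):
    """Single-head entity-aware self-attention (sketch).

    tokens: list of d-dimensional vectors (words and entities in one sequence)
    types:  "word" or "entity" for each position
    W_q:    dict mapping (query_type, key_type) -> d x d query matrix;
            LUKE-style attention uses four query matrices (word-to-word,
            word-to-entity, entity-to-word, entity-to-entity) but a single
            shared key matrix W_k and value matrix W_v.
    """
    d = len(tokens[0])
    keys = [matvec(W_k, t) for t in tokens]
    values = [matvec(W_v, t) for t in tokens]
    out = []
    for x, ti in zip(tokens, types):
        scores = []
        for j, tj in enumerate(types):
            # The query projection depends on the (query type, key type) pair.
            q = matvec(W_q[(ti, tj)], x)
            scores.append(dot(q, keys[j]) / math.sqrt(d))
        attn = softmax(scores)
        out.append([sum(a * v[k] for a, v in zip(attn, values)) for k in range(d)])
    return out
```

With all four query matrices set to the same matrix, this reduces to standard self-attention; the models above instead learn the four matrices separately so that, for example, a word attending to an entity can use a different projection than a word attending to another word.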